61 research outputs found

    An Annotated Corpus for Machine Reading of Instructions in Wet Lab Protocols

    Full text link
    We describe an effort to annotate a corpus of natural language instructions consisting of 622 wet lab protocols to facilitate automatic or semi-automatic conversion of protocols into a machine-readable format and benefit biological research. Experimental results demonstrate the utility of our corpus for developing machine learning approaches to shallow semantic parsing of instructional texts. We make our annotated Wet Lab Protocol Corpus available to the research community.
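
    As a rough illustration of what shallow semantic parsing of an instruction produces, here is a minimal rule-based sketch in Python; the entity labels, action lexicon, and example sentence are hypothetical and far simpler than the corpus's actual annotation scheme or the machine learning models evaluated on it.

        import re

        # Hypothetical, simplified entity types; the actual corpus defines its
        # own annotation scheme for actions, reagents, amounts, devices, etc.
        ACTION_WORDS = {"add", "mix", "incubate", "centrifuge", "transfer"}
        AMOUNT_RE = re.compile(r"^\d+(\.\d+)?(ml|ul|g|mg|min|h)$", re.IGNORECASE)

        def shallow_parse(step: str):
            """Tag each token of a protocol step with a coarse semantic label."""
            tags = []
            for token in step.lower().replace(",", "").split():
                if token in ACTION_WORDS:
                    tags.append((token, "ACTION"))
                elif AMOUNT_RE.match(token):
                    tags.append((token, "AMOUNT"))
                else:
                    tags.append((token, "O"))  # outside any entity
            return tags

        if __name__ == "__main__":
            # Hypothetical protocol instruction, not drawn from the corpus.
            print(shallow_parse("Add 500ul lysis buffer and incubate 30min"))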

    Medical Estimating PF Machine Learning and IoT in Melancholy among Diabetic Patients

    Get PDF
    To analyze the incidence and associated risk factors of depression in patients with type 2 diabetes mellitus in the local community, and to provide scientific references for the clinical prevention and treatment of diabetes mellitus with depression. The proposed approach applies machine learning: 58 patients with type 2 diabetes mellitus were selected by systematic sampling, relevant questionnaires were used to examine demographic factors and related clinical variables, and the Patient Health Questionnaire (PHQ) depression sub-scale was used to assess the degree of depression. The Social Support Rating Scale (SSRS) was used to assess individual levels of social support, and statistical analysis was then conducted. The assessment showed that among the 58 patients with type 2 diabetes, the incidence of comorbid depression was 58%, and differences in age, marital status, education level, occupation, family history, duration of diabetes, complications, exercise, and social support were statistically significant. The factors influencing type 2 diabetes complicated with depression therefore include age, marital status, education level, occupation, family history, duration of diabetes, presence or absence of complications, exercise, and social support. Such patients have a high risk of comorbid depression, which affects the progression of their diabetes.
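
    To make the reported significance testing concrete, here is a minimal sketch of the kind of statistical check such a study runs, using scipy's chi-square test of independence on an entirely hypothetical exercise-versus-depression contingency table; this is not the study's data or its machine learning pipeline.

        import numpy as np
        from scipy.stats import chi2_contingency

        # Illustrative contingency table (hypothetical counts, NOT the study's
        # data): rows = exercises regularly / does not; columns = depressed / not.
        table = np.array([[8, 20],
                          [22, 8]])

        chi2, p_value, dof, expected = chi2_contingency(table)
        print(f"chi2 = {chi2:.2f}, p = {p_value:.4f}")
        # A small p-value (e.g. < 0.05) marks the factor as statistically
        # significant, which is the sense in which the abstract reports
        # significance for age, marital status, exercise, social support, etc.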

    Neural Decoder for Topological Codes using Pseudo-Inverse of Parity Check Matrix

    Full text link
    Recent developments in the field of deep learning have motivated many researchers to apply these methods to problems in quantum information. Torlai and Melko first proposed a decoder for surface codes based on neural networks. Since then, many other researchers have applied neural networks to study a variety of problems in the context of decoding. An important development in this regard was due to Varsamopoulos et al., who proposed a two-step decoder using neural networks. Subsequent work of Maskara et al. used the same concept for decoding under various noise models. We propose a similar two-step neural decoder using the pseudo-inverse of the parity-check matrix for topological color codes. We show that it outperforms the state-of-the-art performance of non-neural decoders for the independent Pauli error noise model on a 2D hexagonal color code. Our final decoder is independent of the noise model and achieves a threshold of 10%. Our result is comparable to the recent work on neural decoders for quantum error correction by Maskara et al. It appears that our decoder has significant advantages with respect to training cost and complexity of the network at higher code lengths when compared to that of Maskara et al. Our proposed method can also be extended to arbitrary dimensions and other stabilizer codes. Comment: 12 pages, 12 figures, 2 tables, submitted to the 2019 IEEE International Symposium on Information Theory.
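
    The first step of such a two-step decoder maps a measured syndrome to some physical error consistent with it, and the pseudo-inverse of the parity-check matrix provides that map. Below is a minimal Python sketch of this algebraic step over GF(2), assuming a toy parity-check matrix with full row rank; the neural second step and the color-code structure are omitted.

        import numpy as np

        def gf2_right_pseudo_inverse(H: np.ndarray) -> np.ndarray:
            """Return P with H @ P == I (mod 2), assuming H has full row rank.

            Works by Gaussian elimination over GF(2) on the augmented
            matrix [H | I]; each pivot column of H contributes one row of P.
            """
            H = H.copy() % 2
            m, n = H.shape
            aug = np.concatenate([H, np.eye(m, dtype=int)], axis=1)
            pivot_cols = []
            row = 0
            for col in range(n):
                pivot = next((r for r in range(row, m) if aug[r, col]), None)
                if pivot is None:
                    continue
                aug[[row, pivot]] = aug[[pivot, row]]
                for r in range(m):
                    if r != row and aug[r, col]:
                        aug[r] ^= aug[row]
                pivot_cols.append(col)
                row += 1
                if row == m:
                    break
            assert row == m, "H must have full row rank for this sketch"
            P = np.zeros((n, m), dtype=int)
            for i, col in enumerate(pivot_cols):
                P[col] = aug[i, n:]
            return P

        # Toy parity-check matrix (NOT a color-code check matrix).
        H = np.array([[1, 1, 0, 1],
                      [0, 1, 1, 1]])
        P = gf2_right_pseudo_inverse(H)
        syndrome = np.array([1, 0])
        e0 = P @ syndrome % 2          # step 1: any error matching the syndrome
        assert np.array_equal(H @ e0 % 2, syndrome)
        # Step 2 (omitted): a neural network predicts the remaining logical
        # correction to combine with e0, which is where such decoders differ.
        print("candidate error:", e0)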

    Characterization of light weight composite proppants

    Get PDF
    The research objectives are to develop experimental and computational techniques to characterize walnut shell particles for use as proppants and to study the influence of polymer coating on their mechanical response. E3-ESEM and Zeiss Axiophot LM are used to study the cellular microstructure and the feasibility of polymer infiltration and uniform coating. Three main testing procedures are undertaken: single-particle compression, heating tests on coated and uncoated walnut shell particles, and 3-point flexure tests. In-situ ESEM observations on both the coated and uncoated particles showed signs of charring at about 175–200 °C. Single-particle compression tests are conducted with random-geometry particles and subsequently with four distinct shape categories to minimize the statistical scatter: flat top, round top, cone top, and high aspect ratio. Single-particle tests on uniformly cut cuboid particles from walnut shell flakes are used to capture the nonlinear material response. Furthermore, cyclic compression loads imposed on flat-top particles reveal that significant permanent deformation sets in even at low load levels. Computational models include a Hertzian representation and 2D and 3D finite element models to simulate single coated and uncoated particles under compression. An elastic material with geometrically nonlinear representation is not able to reproduce the compression response observed during testing. An inelastic material representation significantly improves the simulated compression response and captures the influence of geometric shape on particle response. A single uniform layer of polymer coating is introduced on the 3D models with the nonlinear material definition. Coating provides a marginal improvement in the load vs. displacement response of the particles while increasing their ability to withstand higher loads.
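
    For context on the Hertzian representation mentioned among the computational models, here is a minimal sketch of the classical Hertz contact law for an elastic sphere pressed by a rigid platen; the material parameters are illustrative placeholders, not measured walnut-shell properties, and, as the abstract notes, a purely elastic model of this kind does not reproduce the observed response.

        import numpy as np

        def hertz_force(delta, R, E, nu):
            """Hertzian contact force for an elastic sphere pressed by a rigid
            flat platen: F = (4/3) * E_star * sqrt(R) * delta**1.5,
            with 1/E_star = (1 - nu**2)/E for a rigid counter-surface.
            """
            E_star = E / (1.0 - nu**2)
            return (4.0 / 3.0) * E_star * np.sqrt(R) * delta**1.5

        # Illustrative parameters only -- not measured walnut-shell properties.
        R = 0.5e-3                         # particle radius, m
        E = 3.0e9                          # Young's modulus, Pa
        nu = 0.3                           # Poisson's ratio
        delta = np.linspace(0, 50e-6, 6)   # compression depth, m

        for d, F in zip(delta, hertz_force(delta, R, E, nu)):
            print(f"delta = {d*1e6:5.1f} um  ->  F = {F:8.3f} N")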

    A Survey Paper on Satellite Image Using OpenCV Library over Hadoop Framework

    Get PDF
    In this survey paper, we study land classification from two-dimensional high-resolution satellite images using the Hadoop framework. Advanced image processing algorithms that require substantial computing power over massive-scale inputs can be executed efficiently using the parallel and distributed processing of the Hadoop MapReduce framework. Hadoop MapReduce is a scalable model capable of processing petabytes of data with improved fault tolerance and data locality. In this paper we present a MapReduce framework for performing parallel remote-sensing satellite image processing using Hadoop and storing the output in HBase. The speed and performance results show that by utilizing Hadoop, we can distribute the workload across different clusters to take advantage of combined processing power on commodity hardware.
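
    As a rough sketch of the pipeline described, the following Hadoop Streaming-style mapper and reducer in Python read satellite tiles with OpenCV and aggregate per-class pixel counts; the threshold classifier, class names, and invocation are hypothetical stand-ins for a real land-classification algorithm, and the HBase output stage is omitted.

        #!/usr/bin/env python
        """Hadoop Streaming-style sketch: classify satellite tiles in parallel.

        Mapper: one input line = path to an image tile; emits "class \t count".
        Reducer: sums counts per class across all tiles.
        """
        import sys
        import cv2

        def mapper():
            for line in sys.stdin:
                path = line.strip()
                img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
                if img is None:
                    continue  # unreadable tile; skip
                # Crude brightness thresholds as a stand-in classifier.
                print(f"dark\t{int((img < 85).sum())}")            # e.g. water
                print(f"mid\t{int(((img >= 85) & (img < 170)).sum())}")  # e.g. vegetation
                print(f"bright\t{int((img >= 170).sum())}")        # e.g. built-up

        def reducer():
            totals = {}
            for line in sys.stdin:
                label, count = line.rstrip("\n").split("\t")
                totals[label] = totals.get(label, 0) + int(count)
            for label, count in sorted(totals.items()):
                print(f"{label}\t{count}")

        if __name__ == "__main__":
            # e.g. hadoop ... -mapper "classify.py map" -reducer "classify.py reduce"
            mapper() if sys.argv[1:] == ["map"] else reducer()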

    Efficient Tree-Traversals: Reconciling Parallelism and Dense Data Representations

    Get PDF
    Recent work showed that compiling functional programs to use dense, serialized memory representations for recursive algebraic datatypes can yield significant constant-factor speedups for sequential programs. But serializing data in a maximally dense format consequently serializes the processing of that data, yielding a tension between density and parallelism. This paper shows that a disciplined, practical compromise is possible. We present Parallel Gibbon, a compiler that obtains the benefits of dense data formats and parallelism. We formalize the semantics of the parallel location calculus underpinning this novel implementation strategy, and show that it is type-safe. Parallel Gibbon exceeds the parallel performance of existing compilers for purely functional programs that use recursive algebraic datatypes, including, notably, abstract-syntax-tree traversals as in compilers.
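
    To illustrate the density/parallelism tension (this is not Gibbon's actual layout or compilation strategy), here is a Python sketch of a binary tree serialized into a flat preorder buffer, with a traversal that must scan the buffer left to right.

        # Illustration only: NOT Gibbon's actual layout or compilation scheme.
        # A binary tree of ints is serialized preorder into one flat list:
        # a LEAF tag followed by the value, or a NODE tag followed by both subtrees.
        LEAF, NODE = 0, 1

        def serialize(tree, buf):
            """Append a preorder encoding of ('leaf', n) / ('node', l, r) to buf."""
            if tree[0] == "leaf":
                buf += [LEAF, tree[1]]
            else:
                buf.append(NODE)
                serialize(tree[1], buf)
                serialize(tree[2], buf)
            return buf

        def sum_dense(buf, i=0):
            """Sum all leaves by scanning the buffer; returns (total, next index).

            The whole traversal is one left-to-right pass over contiguous
            memory -- the source of the sequential speedups, but also why a
            parallel worker cannot jump to the right subtree without first
            scanning (or indexing) past the left one.
            """
            if buf[i] == LEAF:
                return buf[i + 1], i + 2
            left, i = sum_dense(buf, i + 1)
            right, i = sum_dense(buf, i)
            return left + right, i

        tree = ("node", ("node", ("leaf", 1), ("leaf", 2)), ("leaf", 3))
        buf = serialize(tree, [])
        total, _ = sum_dense(buf)
        print(buf, "->", total)   # [1, 1, 0, 1, 0, 2, 0, 3] -> 6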